A Weighted Average of Sparse Representations is Better than the Sparsest One Alone

Authors

  • Michael Elad
  • Irad Yavneh
Abstract

Cleaning of noise from signals is a classical and long-studied problem in signal processing. Algorithms for this task necessarily rely on a-priori knowledge about the signal characteristics, along with information about the noise properties. For signals that admit sparse representations over a known dictionary, a commonly used denoising technique is to seek the sparsest representation that synthesizes a signal close enough to the corrupted one. As this problem is too complex in general, approximation methods, such as greedy pursuit algorithms, are often employed. This line of reasoning suggests that detecting the sparsest representation is the key to successful denoising. Does this mean that other competitive, slightly inferior sparse representations are meaningless? Suppose we are given a group of competing sparse representations, each claiming to explain the signal differently. Can they be fused somehow to yield a better result? Surprisingly, the answer is positive; merging these representations can form a more accurate, yet dense, estimate of the original signal, even when the latter is known to be sparse. In this paper we demonstrate this behavior, propose a practical way to generate such a collection of representations by randomizing the Orthogonal Matching Pursuit (OMP) algorithm, and provide a clear analytical justification for the superiority of the associated Randomized OMP (RandOMP) algorithm. We show that while the Maximum a-posteriori Probability (MAP) estimator aims to find and use the sparsest representation, the Minimum Mean-Squared-Error (MMSE) estimator leads to a fusion of representations to form its result. Thus, by working with an appropriate mixture of candidate representations, we surpass the MAP estimate and tend towards the MMSE estimate, thereby obtaining a far more accurate estimation, especially at medium and low SNR.

∗This research was supported by the Center for Security Science and Technology – Technion.
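
The randomization described here, turning OMP's greedy atom choice into a probabilistic one and averaging many runs, can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact algorithm: the selection rule (a soft-max over squared correlations with a `temperature` knob) and the function names are assumptions made for illustration, and the selection weights derived in the paper also depend on the noise and signal-prior variances.

```python
import numpy as np

def randomized_omp(D, y, k, temperature=1.0, rng=None):
    """One randomized OMP run. Instead of always taking the atom with the
    largest correlation to the residual (plain greedy OMP), the next atom is
    drawn with a probability that grows with its squared correlation.
    The exp(temperature * corr**2) rule is an illustrative assumption;
    D is assumed to have unit-norm columns."""
    rng = np.random.default_rng() if rng is None else rng
    L = D.shape[1]
    support = []
    residual = y.copy()
    for _ in range(k):
        corr = D.T @ residual                 # correlations with the residual
        score = temperature * corr ** 2
        score[support] = -np.inf              # never reselect a chosen atom
        probs = np.exp(score - score.max())   # stabilised soft-max weights
        probs /= probs.sum()
        support.append(int(rng.choice(L, p=probs)))
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, y, rcond=None)   # refit on the support
        residual = y - Ds @ coef
    alpha = np.zeros(L)
    alpha[support] = coef
    return alpha

def randomp_denoise(D, y, k, runs=50):
    """Average the signals synthesised by many randomized runs. The average of
    the candidate sparse representations is dense, yet it approximates the
    MMSE estimate better than the single sparsest (MAP-like) solution."""
    return np.mean([D @ randomized_omp(D, y, k) for _ in range(runs)], axis=0)
```

A single run behaves like a MAP-style estimate, while the average over many runs moves towards the MMSE estimate discussed in the abstract; averaging the synthesized signals is equivalent to averaging the representations and then synthesizing, since the dictionary acts linearly.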


Similar articles

Sparse Representations Are Most Likely to Be the Sparsest Possible

Given a signal S ∈ R^N and a full-rank matrix D ∈ R^{N×L} with N < L, we define the signal’s overcomplete representations as all α ∈ R^L satisfying S = Dα. Among all the possible solutions, we have special interest in the sparsest one: the one minimizing ‖α‖_0. Previous work has established that a representation is unique if it is sparse enough, requiring ‖α‖_0 < Spark(D)/2. The measure Spark(D) stan...
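
For concreteness, the uniqueness claim in this snippet can be written as follows (a restatement of the standard condition in the snippet's own notation, not text recovered from the truncation):

```latex
% If a representation is sparse enough relative to Spark(D), it is
% necessarily the unique sparsest solution of S = D*alpha.
\[
  S = D\alpha
  \quad\text{and}\quad
  \|\alpha\|_0 < \tfrac{1}{2}\,\operatorname{Spark}(D)
  \;\;\Longrightarrow\;\;
  \alpha \text{ is the unique sparsest representation of } S.
\]
```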

Recovering non-negative and combined sparse representations

The non-negative solution to an underdetermined linear system can be uniquely recovered sometimes, even without imposing any additional sparsity constraints. In this paper, we derive conditions under which a unique non-negative solution for such a system can exist, based on the theory of polytopes. Furthermore, we develop the paradigm of combined sparse representations, where only a part of the...
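
In symbols, the setting this snippet studies is the non-negative underdetermined system below (a generic restatement using the same notation as the first snippet above, not the paper's own formulation):

```latex
% Underdetermined linear system with a non-negativity constraint:
% fewer equations than unknowns, yet the solution can be unique.
\[
  \text{find } \alpha \in \mathbb{R}^{L} \ \text{such that}\ \;
  D\alpha = S, \quad \alpha \ge 0,
  \qquad D \in \mathbb{R}^{N \times L},\; N < L.
\]
```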

Iterative Weighted Non-smooth Non-negative Matrix Factorization for Face Recognition

Non-negative Matrix Factorization (NMF) is a part-based image representation method. It comes from the intuitive idea that an entire face image can be constructed by combining several parts. In this paper, we propose a framework for face recognition by finding localized, part-based representations, denoted “Iterative weighted non-smooth non-negative matrix factorization” (IWNS-NMF). A new cost fun...
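
The part-based intuition corresponds to the standard NMF model shown below; the specific IWNS-NMF cost function is truncated in this snippet and is not reproduced here, and the symbol names are generic:

```latex
% Approximate a non-negative data matrix (e.g. vectorized face images as
% columns of V) by non-negative parts W and non-negative coefficients H.
\[
  V \approx W H, \qquad
  V \in \mathbb{R}_{\ge 0}^{m \times n},\;
  W \in \mathbb{R}_{\ge 0}^{m \times r},\;
  H \in \mathbb{R}_{\ge 0}^{r \times n},\;
  r \ll \min(m,n).
\]
```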

MMSE Approximation for Denoising Using Several Sparse Representations

Cleaning of noise from signals is a classical and long-studied problem in signal processing. For signals that admit sparse representations over a known dictionary, MAP-based denoising seeks the sparsest representation that synthesizes a signal close to the corrupted one. While this task is NP-hard, it can usually be approximated quite well by a greedy method, such as the Orthogonal Matching Pur...
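
The MAP-flavoured denoising problem sketched in this snippet has the standard form below (generic notation; ε stands for the assumed noise level):

```latex
% Among all representations that explain the noisy signal y up to the
% noise level, pick the sparsest one; this is NP-hard in general, hence
% the greedy OMP approximation mentioned above.
\[
  \hat{\alpha} = \arg\min_{\alpha} \|\alpha\|_0
  \quad \text{subject to} \quad \|y - D\alpha\|_2 \le \varepsilon .
\]
```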

A New Computational Method for the Sparsest Solutions to Systems of Linear Equations

The connection between the sparsest solution to an underdetermined system of linear equations and the weighted l1-minimization problem is established in this paper. We show that seeking the sparsest solution to a linear system can be transformed to searching for the densest slack variable of the dual problem of weighted l1-minimization with all possible choices of nonnegative weights. Motivated...
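
For reference, the weighted l1-minimization problem this snippet connects to the sparsest-solution problem has the standard form below; the dual/slack-variable construction itself is truncated above and not reproduced, and the notation is generic:

```latex
% Weighted l1-minimization with non-negative weights w_i; the snippet
% relates the sparsest solution of the linear system to this family of
% problems over all choices of w >= 0.
\[
  \min_{x} \; \sum_{i} w_i\, |x_i|
  \quad \text{subject to} \quad D x = S, \qquad w_i \ge 0 .
\]
```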


Publication date: 2008